📚 node [[batch|batch]]
Welcome! Nobody has contributed anything to 'batch' yet. You can:
  • Write something in the document below!
    • There is at least one public document in every node in the Agora. Whatever you write in it will be integrated and made available for the next visitor to read and edit.
  • Write to the Agora from social media.
    • If you follow Agora bot on a supported platform and include the wikilink [[batch|batch]] in a post, the Agora will link it here and optionally integrate your writing.
  • Sign up as a full Agora user.
    • As a full user you will be able to contribute your personal notes and resources directly to this knowledge commons. Some setup required :)
⥅ related node [[2010 05 12 tungle and batchbook now integrated]]
⥅ related node [[2010 06 08 batchbook user group and intro june 15th 2010]]
⥅ related node [[stephen batchelor]]
⥅ related node [[batch]]
⥅ related node [[batch_normalization]]
⥅ related node [[batch_size]]
⥅ related node [[mini batch]]
⥅ related node [[mini batch_stochastic_gradient_descent_(sgd)]]
⥅ node [[batch]] pulled by Agora

batch

Go back to the [[AI Glossary]]

The set of examples used in one iteration (that is, one gradient update) of model training.

See also batch size.
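To make the definition concrete, here is a minimal sketch (with hypothetical placeholder data) showing that one batch is simply the slice of training examples consumed by a single gradient update:

```python
def iterate_batches(examples, batch_size):
    """Yield successive batches; each batch drives one gradient update."""
    for start in range(0, len(examples), batch_size):
        yield examples[start:start + batch_size]

examples = list(range(10))  # 10 stand-in training examples
batches = list(iterate_batches(examples, batch_size=4))
# 10 examples with batch_size=4 yield batches of sizes 4, 4, and 2,
# i.e. three gradient updates per pass over the data
```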

⥅ node [[batch_normalization]] pulled by Agora

batch normalization

Go back to the [[AI Glossary]]

Normalizing the input or output of the activation functions in a hidden layer. Batch normalization can provide the following benefits:

  • Make neural networks more stable by protecting against outlier weights.
  • Enable higher learning rates.
  • Reduce overfitting.
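The normalization step above can be sketched in a few lines. This is an illustrative scalar version only (the names `batch_norm`, `gamma`, and `beta` are assumptions; in practice gamma and beta are learned parameters, and frameworks also track running statistics for inference):

```python
import math

def batch_norm(activations, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a batch of scalar activations to roughly zero mean and
    unit variance, then scale by gamma and shift by beta."""
    n = len(activations)
    mean = sum(activations) / n
    var = sum((a - mean) ** 2 for a in activations) / n
    return [gamma * (a - mean) / math.sqrt(var + eps) + beta
            for a in activations]

out = batch_norm([2.0, 4.0, 6.0])
# out has (approximately) zero mean and unit variance
```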
⥅ node [[batch_size]] pulled by Agora

batch size

Go back to the [[AI Glossary]]

The number of examples in a batch. For example, the batch size of SGD (stochastic gradient descent) is 1, while a mini-batch usually contains between 10 and 1,000 examples. Batch size is usually fixed during training and inference; however, TensorFlow does permit dynamic batch sizes.
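One practical consequence of batch size is how many gradient updates a pass over the data takes. A small sketch (the dataset size is illustrative):

```python
import math

def updates_per_epoch(num_examples, batch_size):
    """Gradient updates needed to see every example once (one epoch)."""
    return math.ceil(num_examples / batch_size)

n = 60_000  # e.g. a dataset roughly the size of MNIST (illustrative)
print(updates_per_epoch(n, 1))    # pure SGD: 60000 updates per epoch
print(updates_per_epoch(n, 128))  # mini-batch of 128: 469 updates per epoch
```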
